Consistency of Sequence Classification with Entropic Priors

Authors

  • Francesco A. N. Palmieri
  • Domenico Ciuonzo
Abstract

Entropic priors, recently revisited within the context of theoretical physics, were originally introduced for image processing and for general statistical inference. They appear to be a very promising approach to "objective" prior determination when prior information is not available. Attention has so far been mostly limited to continuous parameter spaces, and our focus in this work is on applying the entropic-prior idea to Bayesian inference with discrete classes in signal processing problems. Unfortunately, it is well known that entropic priors, when applied to sequences, may lead to excessive spreading of the entropy as the number of samples grows. In this paper we show that the spreading of the entropy may be tolerated as long as the posterior probabilities remain consistent. Using the Asymptotic Equipartition Property (AEP), we derive a condition for posterior consistency based on conditional entropies and KL divergences. Furthermore, we show that entropic priors can be modified to enforce posterior consistency by adding a constraint to the joint entropy maximization. Simulations of entropic priors applied to a coin-flipping experiment are included.
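As a concrete illustration of the discrete entropic-prior idea (a minimal Python sketch, not the authors' code), the snippet below applies it to a coin-flipping setup like the one simulated in the paper. It assumes the form obtained by maximizing the joint entropy H(X, C) = H(C) + Σ_c P(c) H(X|c) over the class prior, namely P(c) ∝ exp(H(X|c)); for n independent flips of a coin with bias p_c this gives H(X|c) = n·Hb(p_c), so the prior weight of the lower-entropy coin shrinks exponentially with n, which is the "spreading" issue discussed above. The coin biases and all names are illustrative assumptions.

```python
import numpy as np

def binary_entropy(p):
    """Shannon entropy (in nats) of a Bernoulli(p) variable."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def entropic_prior(biases, n):
    """Discrete entropic prior P(c) ~ exp(H(X|c)) over the coin classes.

    For n i.i.d. flips of a coin with bias p_c, H(X|c) = n * Hb(p_c),
    so the prior increasingly favors the higher-entropy coin as n grows.
    """
    log_w = n * binary_entropy(np.asarray(biases))
    log_w -= log_w.max()              # avoid overflow in the exponentials
    w = np.exp(log_w)
    return w / w.sum()

def coin_posterior(biases, prior, k, n):
    """Posterior over the coin classes after observing k heads in n flips."""
    biases = np.asarray(biases)
    log_post = (np.log(prior)
                + k * np.log(biases)
                + (n - k) * np.log(1 - biases))
    log_post -= log_post.max()
    post = np.exp(log_post)
    return post / post.sum()

rng = np.random.default_rng(0)
biases = [0.7, 0.2]                   # hypothetical pair of coins
for n in (10, 100, 1000):
    k = rng.binomial(n, biases[1])    # data actually come from the second coin
    prior = entropic_prior(biases, n)
    post = coin_posterior(biases, prior, k, n)
    print(f"n={n:4d}  prior={np.round(prior, 3)}  posterior={np.round(post, 3)}")
```

Even though the prior probability of the second (lower-entropy) coin vanishes as n grows, the posterior still locks onto it, in line with the paper's point that entropy spreading is tolerable when the posterior remains consistent: here the per-sample likelihood advantage KL(Ber(0.2) ‖ Ber(0.7)) ≈ 0.53 nats outpaces the per-sample prior penalty Hb(0.7) − Hb(0.2) ≈ 0.11 nats.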


Related articles

Data Fusion with Entropic Priors

In classification problems, lack of knowledge of the prior distribution may make the application of Bayes' rule inadequate. Uniform or arbitrary priors may provide classification answers that, even in simple examples, end up contradicting our common sense about the problem. Entropic priors, via application of the maximum entropy principle, seem to provide a much better answer and can ...

Objective priors from maximum entropy in data classification

Lack of knowledge of the prior distribution in classification problems that operate on small data sets may make the application of Bayes' rule questionable. Uniform or arbitrary priors may provide classification answers that, even in simple examples, end up contradicting our common sense about the problem. Entropic priors (EPs), via application of the maximum entropy (ME) principle, seem to...

Entropic Priors for Discrete Probabilistic Networks and for Mixtures of Gaussians Models

The ongoing, unprecedented exponential explosion of available computing power has radically transformed the methods of statistical inference. What used to be a small minority of statisticians advocating the use of priors and strict adherence to Bayes' theorem is now becoming the norm across disciplines. The evolutionary direction is now clear. The trend is towards more realistic, flexi...

Recursive multiple-priors

This paper axiomatizes an intertemporal version of multiple-priors utility. A central axiom is dynamic consistency, which leads to a recursive structure for utility, to ‘rectangular’ sets of priors and to prior-by-prior Bayesian updating as the updating rule for such sets of priors. It is argued that dynamic consistency is intuitive in a wide range of situations and that the model is consistent...

Approximate Maximum A Posteriori Inference with Entropic Priors

In certain applications it is useful to fit multinomial distributions to observed data with a penalty term that encourages sparsity. For example, in probabilistic latent audio source decomposition one may wish to encode the assumption that only a few latent sources are active at any given time. The standard heuristic of applying an L1 penalty is not an option when fitting the parameters to a mu...
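For intuition on the entropic-penalty approach this abstract refers to, here is a minimal, generic sketch (an assumed objective form, not the paper's actual algorithm): it fits multinomial parameters θ by maximizing Σ_i c_i log θ_i − β H(θ) with gradient ascent on softmax logits, where β > 0 rewards low-entropy (sparse) θ; an L1 penalty would be useless here because ‖θ‖₁ = 1 on the simplex. The function name, counts, and β value are illustrative.

```python
import numpy as np

def map_multinomial_entropic(counts, beta, iters=5000, lr=0.05):
    """MAP-style fit of multinomial parameters under an entropy penalty.

    Maximizes  L(theta) = sum_i counts_i * log(theta_i) - beta * H(theta)
    by gradient ascent on softmax logits. beta > 0 rewards low-entropy
    (sparse) theta; beta = 0 recovers the maximum-likelihood estimate.
    A generic illustration, not the algorithm from the paper above.
    """
    counts = np.asarray(counts, dtype=float)
    logits = np.log(counts + 1.0)              # rough initialization
    for _ in range(iters):
        z = np.exp(logits - logits.max())
        theta = z / z.sum()
        # dL/dtheta_i: likelihood term plus entropy-penalty term
        g = counts / theta + beta * (np.log(theta) + 1.0)
        # chain rule through the softmax parameterization
        logits += lr * theta * (g - np.dot(theta, g))
    return theta

counts = [5, 3, 1, 1, 0]
print(np.round(map_multinomial_entropic(counts, beta=0.0), 3))  # ~ML estimate
print(np.round(map_multinomial_entropic(counts, beta=5.0), 3))  # sparser fit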

Publication date: 2011